PROBLEM SOLVING IN A HEALTH-CARE FORM DATA SYSTEM

VU HIEU NGUYEN

SOFTWARE ENGINEER



Hieu Nguyen Vu - Oct 15, 2016  ·  6 min read

Overview of this project

Each time a patient comes to the health-care facility, nurses take a survey to collect information about the patient. During the medical examination, the doctors ask the patient about diseases, living conditions, habits, etc. All of this data is collected and helps the patient at follow-up visits. At a higher level, the government can aggregate the medical data of a residential area or a province and make medical decisions based on it.

Storing data on the client side (browser side)

One of my challenges was to store patient data on the browser side. This means the data is stored on the care facility's own devices. We support storing data for up to 5,000 users per care facility, including sensitive data that could disclose patient information. => We use IndexedDB to store this large amount of data.

On a modern browser like Chrome, a user could still view this sensitive data without logging in.

=> AES-256 is a good encryption solution for this case. When the user logs in and needs to view the data, we decrypt it and display it on screen.
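As an illustration of this encrypt-before-store, decrypt-on-login flow: in the browser you would typically use the Web Crypto API, but the sketch below uses Node's built-in crypto module with AES-256-GCM so it is easy to run standalone. The key-derivation parameters and record shapes are made up for illustration; treat this as a minimal sketch, not our actual implementation.

```typescript
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from "node:crypto";

// Derive a 256-bit key from the user's password. In a real system the
// salt would be stored alongside the ciphertext.
function deriveKey(password: string, salt: Buffer): Buffer {
  return scryptSync(password, salt, 32);
}

// Encrypt a patient record before writing it to IndexedDB.
function encryptRecord(plaintext: string, key: Buffer): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12); // 96-bit IV, the standard size for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"), // lets decryption detect tampering
    data: data.toString("base64"),
  };
}

// Decrypt after the user logs in, just before rendering on screen.
function decryptRecord(box: { iv: string; tag: string; data: string }, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(box.iv, "base64"));
  decipher.setAuthTag(Buffer.from(box.tag, "base64"));
  return Buffer.concat([
    decipher.update(Buffer.from(box.data, "base64")),
    decipher.final(),
  ]).toString("utf8");
}
```

With GCM you also get integrity for free: if someone edits the stored ciphertext, `decipher.final()` throws instead of returning garbage.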

If sensitive data is stored in the browser, how can we keep all the data consistent across all devices of a care facility?

=> We have a backup/restore function that allows the user to export all the sensitive data in encrypted form and import it on another device. => I had to validate this file: its encoding, encryption, data, version, etc.
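The restore-side validation can be sketched like this. The envelope fields, version list, and error messages below are illustrative, not our real export format:

```typescript
// Illustrative shape of an exported backup file.
interface BackupEnvelope {
  version: string;  // export format version, e.g. "1.2"
  encoding: string; // how the payload is encoded
  cipher: string;   // which cipher produced the payload
  payload: string;  // the encrypted records
}

const SUPPORTED_VERSIONS = ["1.0", "1.1", "1.2"];

// Returns a list of human-readable problems; an empty list means the
// file is acceptable and the import can proceed.
function validateBackup(raw: string): string[] {
  let env: Partial<BackupEnvelope>;
  try {
    const parsed = JSON.parse(raw);
    if (typeof parsed !== "object" || parsed === null) return ["File is not a JSON object"];
    env = parsed;
  } catch {
    return ["File is not valid JSON"];
  }

  const errors: string[] = [];
  if (!env.version || !SUPPORTED_VERSIONS.includes(env.version)) {
    errors.push(`Unsupported version: ${env.version ?? "(missing)"}`);
  }
  if (env.encoding !== "base64") errors.push("Unexpected payload encoding");
  if (env.cipher !== "aes-256-gcm") errors.push("Unexpected cipher");
  if (typeof env.payload !== "string" || env.payload.length === 0) {
    errors.push("Missing encrypted payload");
  }
  return errors;
}
```

Checking the envelope before touching the payload means a corrupt or hand-edited file is rejected with a clear message instead of failing halfway through a restore.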

Import function

To support digital transformation, this system provides a function that allows the user to import patient data and form data as CSV files. Importing the form data was a real challenge for me, because there are up to 30 types, and each type has a specific model with up to 200 data fields. The maximum is 50 MB per file and 20 files for a one-time import.

I. Validate the file data before sending it to the server

How did I solve it? Split the task. I split the validation into four subtasks.

  1. Validate the file (encoding, file type, data types, number of columns, required columns, authority...)
  2. Validate the fields of each record.
  3. Validate the relations between fields in a record (group values, required within a set, values that depend on another field...)
  4. Call an API to validate some specific fields on the server.
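The four steps above can be sketched as a small pipeline where each stage runs only if the previous one passed, so the user fixes cheap file-level problems before we spend server round-trips. The stage names, columns, and rules below are made up for illustration:

```typescript
type Validator<T> = (input: T) => string[]; // returns error messages

// Run stages in order; stop at the first stage that reports errors.
// Returns null when every stage passes.
function runStages<T>(
  input: T,
  stages: Array<{ name: string; check: Validator<T> }>,
): { stage: string; errors: string[] } | null {
  for (const stage of stages) {
    const errors = stage.check(input);
    if (errors.length > 0) return { stage: stage.name, errors };
  }
  return null;
}

// A CSV row after parsing: column name -> cell value (illustrative).
interface CsvRow { [column: string]: string; }

// Step 2: per-field checks on each record.
const fieldStage: Validator<CsvRow[]> = rows =>
  rows.flatMap((row, i) =>
    row["patient_id"] ? [] : [`row ${i + 1}: patient_id is required`]);

// Step 3: cross-field checks, e.g. a value required only in some cases.
const relationStage: Validator<CsvRow[]> = rows =>
  rows.flatMap((row, i) =>
    row["insurance_type"] === "public" && !row["insurance_number"]
      ? [`row ${i + 1}: insurance_number is required for public insurance`]
      : []);
```

The server-side check (step 4) slots in as just another stage at the end of the list, which keeps the expensive API call behind all the local ones.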

II. Send to the server

  1. This function uses an Azure Durable Function on the back end. That means I send one request to submit the data, then keep sending status requests until I get a success status (the back-end import has completed).
  2. The business logic allowed users to import multiple files with different data types at the same time, and some special rules required certain files to be imported before others, or never at the same time as others. => This required controlling many async tasks at once while keeping the business logic correct and saving the user's time. Doesn't sound too challenging, right? Now combine it with [1], the Azure Durable Function polling.
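One way to express "file A must be imported before file B, while independent files still run in parallel" is to plan the imports as dependency-ordered batches up front. A minimal sketch, assuming a made-up dependency map between file types:

```typescript
// Maps each file type to the types that must be imported before it.
type DependencyMap = Record<string, string[]>;

// Group file types into batches: everything within a batch can be
// imported in parallel; batches run strictly one after another.
function planBatches(types: string[], deps: DependencyMap): string[][] {
  const remaining = new Set(types);
  const done = new Set<string>();
  const batches: string[][] = [];
  while (remaining.size > 0) {
    // A type is ready when all of its prerequisites are either already
    // imported or not part of this import at all.
    const ready = [...remaining].filter(t =>
      (deps[t] ?? []).every(d => done.has(d) || !remaining.has(d)));
    if (ready.length === 0) {
      throw new Error("Circular dependency between file types");
    }
    batches.push(ready);
    for (const t of ready) {
      remaining.delete(t);
      done.add(t);
    }
  }
  return batches;
}
```

Each batch can then be kicked off together (e.g. with `Promise.all`), where each task sends its data and polls the durable-function status until completion, matching the flow in [1].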

IE11 / Edge (legacy Edge) browser support

IE11 and legacy Edge are terrible, but Japanese users still use them, and that's why we still support them.

  1. It's a 32-bit process. That means its memory gets stuck when it reaches about 1.7-1.8 GB, and the browser auto-reloads (or asks the user to reload).
  2. It doesn't give each tab its own thread memory. That means all the tabs of the browser, service workers, etc. share that ~1.7 GB.
  3. CSS support is outdated, and some JS functions don't work (though some of them can be fixed with polyfills).

That's not everything! In this project I had many tasks and many functions, but these are some of the interesting ones I wanted to share!
